National Repository of Grey Literature: 3 records found
Least Squares Alternatives
Gerthofer, Michal ; Pešta, Michal (advisor) ; Kulich, Michal (referee)
In the present thesis we deal with linear regression models based on least squares. These methods are discussed in two groups. The first one focuses on three primary approaches, divided by the occurrence of errors in variables. The traditional approach penalizes only the misfit in the dependent-variable part and is called ordinary least squares (OLS). An opposite case to OLS is represented by data least squares (DLS), which allows corrections only in the explanatory variables. Subsequently, we concentrate on the total least squares (TLS) approach, which minimizes the squares of errors in the values of both dependent and independent variables. Finally, we give attention to a second group of methods with a high breakdown point, which deal with the significance of individual observations (least weighted squares) and the elimination of outlying observations (least trimmed squares). The main purpose of this work is to describe and compare these models, their assumptions, characteristics, and the properties of their estimates, and to illustrate them on real data.
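The OLS/TLS contrast the abstract describes can be sketched numerically: under errors in the explanatory variable, OLS (which penalizes misfit in the response only) attenuates the slope, while TLS (which penalizes errors in both variables, computed via the SVD of the augmented data matrix) corrects for it. This is a minimal sketch on simulated data; the variable names, error scales, and sample size are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                          # true explanatory variable
x_obs = x + rng.normal(scale=0.3, size=n)       # observed with error (EIV setting)
y_obs = 2.0 * x + rng.normal(scale=0.3, size=n) # true slope is 2.0

# OLS: penalize misfit in the dependent variable only
X = x_obs[:, None]
b_ols = np.linalg.lstsq(X, y_obs, rcond=None)[0][0]

# TLS: penalize errors in both variables via the SVD of the augmented matrix [X | y];
# the solution comes from the right singular vector of the smallest singular value
Z = np.column_stack([x_obs, y_obs])
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
V = Vt.T
b_tls = -V[0, 1] / V[1, 1]
```

Because the measurement error in x shrinks the OLS slope toward zero, `b_ols` lands noticeably below 2, while `b_tls` stays close to the true slope (TLS is consistent here since both error variances are equal).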
A Nonparametric Bootstrap Comparison of Variances of Robust Regression Estimators.
Kalina, Jan ; Tobišková, Nicole ; Tichavský, Jan
While various robust regression estimators are available for the standard linear regression model, performance comparisons of individual robust estimators over real or simulated datasets still seem to be lacking. In general, a reliable robust estimator of regression parameters should be consistent and, at the same time, have relatively small variability, i.e., the variances of the individual regression parameters should be small. The aim of this paper is to compare the variability of S-estimators, MM-estimators, least trimmed squares, and least weighted squares estimators. While all of them are consistent under general assumptions, the asymptotic covariance matrix of the least weighted squares remains infeasible, because the only available formula for its computation depends on the unknown random errors. We therefore resort to a nonparametric bootstrap comparison of the variability of the different robust regression estimators. It turns out that the best results are obtained either with MM-estimators or with the least weighted squares with suitable weights. The latter estimator is particularly recommended for small sample sizes.
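The pairs-bootstrap idea behind the paper's comparison can be sketched without specialized libraries: resample (x, y) pairs with replacement, refit the estimator on each resample, and take the empirical variance of the coefficients. The `lts_fit` and `bootstrap_variance` helpers below are hypothetical illustrations, not the authors' code; `lts_fit` is only a crude concentration-step approximation of least trimmed squares, not the S-, MM-, or LWS estimators studied in the paper.

```python
import numpy as np

def lts_fit(X, y, h, n_iter=20):
    """Crude least trimmed squares via concentration (C-)steps: repeatedly
    refit OLS on the h observations with the smallest squared residuals.
    A sketch, not the FAST-LTS algorithm."""
    idx = np.argsort(np.abs(y - np.median(y)))[:h]  # crude robust start
    for _ in range(n_iter):
        b = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        idx = np.argsort((y - X @ b) ** 2)[:h]
    return b

def bootstrap_variance(X, y, estimator, n_boot=200, seed=1):
    """Nonparametric (pairs) bootstrap variance of a regression estimator."""
    rng = np.random.default_rng(seed)
    n = len(y)
    boots = np.array([estimator(X[s], y[s])
                      for s in (rng.integers(0, n, n) for _ in range(n_boot))])
    return boots.var(axis=0)  # per-coefficient bootstrap variances

# simulated data with 15% gross outliers (illustrative, not the paper's datasets)
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
y[:15] += 20.0  # contaminate the response

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_lts = lts_fit(X, y, h=80)
var_ols = bootstrap_variance(X, y, lambda Xb, yb: np.linalg.lstsq(Xb, yb, rcond=None)[0])
var_lts = bootstrap_variance(X, y, lambda Xb, yb: lts_fit(Xb, yb, h=80))
```

The same `bootstrap_variance` wrapper works for any estimator that maps (X, y) to a coefficient vector, which is what makes the bootstrap attractive when, as for the least weighted squares, the asymptotic covariance formula is infeasible.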
